Boosting capuchin search with stochastic learning strategy for feature selection

Authors

Abstract

The technological revolution has made available large amounts of data containing many irrelevant and noisy features that distort the analysis process and increase processing time. Therefore, feature selection (FS) approaches are used to select the smallest subset of relevant features. Feature selection is viewed as an optimization problem to which meta-heuristics have been successfully applied. Thus, in this paper, a new FS approach is proposed based on an enhanced version of the Capuchin search algorithm (CapSA). In the developed FS approach, named ECapSA, three modifications are introduced to avoid the lack of diversity and premature convergence of the basic CapSA: (1) the inertia weight is adjusted using a logistic map, (2) sine-cosine acceleration coefficients are added to improve convergence, and (3) a stochastic learning strategy adds more diversity to the movement via a Levy random walk. To demonstrate its performance, different datasets are used, and the approach is compared with other well-known methods. The results provide evidence of the superiority of ECapSA over the tested competitive methods in terms of the evaluation metrics.
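The abstract names the three ingredients of ECapSA but gives no formulas. The following is a minimal sketch of standard forms of those ingredients, not the authors' exact equations: a chaotic logistic map for the inertia weight, a sine/cosine schedule for the acceleration coefficients, and a Levy-flight step (here via Mantegna's algorithm) for diversified movement. All parameter values and function shapes are assumptions for illustration.

```python
import math

import numpy as np


def logistic_map(x):
    """One step of the chaotic logistic map; values in [0, 1] stay in [0, 1]."""
    return 4.0 * x * (1.0 - x)


def sine_cosine_coeffs(t, t_max, c_min=0.5, c_max=2.5):
    """Assumed time-varying acceleration coefficients: c1 grows with a sine,
    c2 decays with a cosine, over iterations t = 0 .. t_max."""
    c1 = c_min + (c_max - c_min) * math.sin(math.pi * t / (2 * t_max))
    c2 = c_min + (c_max - c_min) * math.cos(math.pi * t / (2 * t_max))
    return c1, c2


def levy_step(dim, beta=1.5, rng=None):
    """Levy-flight random step via Mantegna's algorithm: heavy-tailed
    moves that occasionally jump far, adding diversity to the search."""
    rng = rng if rng is not None else np.random.default_rng()
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))
             ) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)
```

In a CapSA-style loop, a candidate position update would then mix the chaotic inertia weight with the scheduled coefficients and occasionally perturb a stalled capuchin by a Levy step; for FS, positions are typically thresholded into a binary feature mask.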


Similar resources

An Improved K-Nearest Neighbor with Crow Search Algorithm for Feature Selection in Text Documents Classification

The Internet provides easy access to a kind of library resources. However, classification of documents from a large amount of data is still an issue and demands time and energy to find certain documents. Classification of similar documents in specific classes of data can reduce the time for searching the required data, particularly text documents. This is further facilitated by using Artificial...


A dependency-based search strategy for feature selection

Feature selection has become an increasingly important field of research. It aims at finding optimal feature subsets that can achieve better generalization on unseen data. However, this can be a very challenging task, especially when dealing with large feature sets. Hence, a search strategy is needed to explore a relatively small portion of the search space in order to find "semi-optimal" subse...


Stochastic Attribute Selection Committees with Multiple Boosting: Learning More

Classifier learning is a key technique for KDD. Approaches to learning classifier committees, including Boosting, Bagging, Sasc, and SascB, have demonstrated great success in increasing the prediction accuracy of decision trees. Boosting and Bagging create different classifiers by modifying the distribution of the training set. Sasc adopts a different method. It generates committees by stochastic ma...


Boosting Soft-Margin SVM with Feature Selection for Pedestrian Detection

We present an example-based algorithm for detecting objects in images by integrating component-based classifiers, which automatically select the best feature for each classifier and are combined according to the AdaBoost algorithm. The system employs a soft-margin SVM for the base learner, which is trained for all features, and the optimal feature is selected at each stage of boosting. We employe...


Journal

Journal title: Neural Computing and Applications

Year: 2023

ISSN: 0941-0643, 1433-3058

DOI: https://doi.org/10.1007/s00521-023-08400-8